

Search for: All records

Creators/Authors contains: "Passonneau, Rebecca"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Free, publicly-accessible full text available July 27, 2026
  2. Free, publicly-accessible full text available June 10, 2026
  3. Free, publicly-accessible full text available June 10, 2026
  4. Writing scientific explanations is a core practice in science. However, students find it difficult to write coherent scientific explanations, and teachers find it challenging to provide real-time feedback on students’ essays. In this study, we discuss how PyrEval, an NLP technology, was used to automatically assess students’ essays and provide feedback. We found that students explained more key ideas in their essays after the automated assessment and feedback. However, there were issues both with the automated assessments and with students’ understanding of the feedback and how they revised their essays. 
  5. To assess student knowledge, educators face a tradeoff between open-ended and fixed-response questions. Open-ended questions are easier to formulate and provide greater insight into student learning, but are burdensome to assess. Machine learning methods that could reduce the assessment burden also have a cost, given that large datasets of reliably assessed examples (labeled data) are required for training and testing. We address the human costs of assessment and data labeling using selective prediction, where the output of a machine-learned model is used when the model makes a confident decision, but otherwise the model defers to a human decision-maker. The goal is to defer less often while maintaining human assessment quality on the total output. We refer to the deferral criteria as a deferral policy, and we show it is possible to learn when to defer. We first trained an autograder on a combination of historical data and a small amount of newly labeled data, achieving moderate performance. We then used the autograder output as input to a logistic regression to learn when to defer. The learned logistic regression equation constitutes a deferral policy. Tests of the selective prediction method on a held-out test set showed that human-level assessment quality can be achieved with a major reduction of human effort. 
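The selective-prediction setup in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the synthetic "autograder confidence" feature, the data-generating assumptions, and the 0.8 deferral threshold are all hypothetical stand-ins for whatever features and operating point the authors actually used.

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for autograder output: a confidence score per essay,
# paired with whether the autograder's label matched the human label.
# (Assumption: higher confidence correlates with higher correctness.)
data = []
for _ in range(1000):
    conf = random.random()
    correct = 1 if random.random() < 0.4 + 0.6 * conf else 0
    data.append((conf, correct))

# Fit a one-feature logistic regression by gradient descent, mapping
# autograder confidence to predicted correctness. The learned equation
# plays the role of the "deferral policy" described in the abstract.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    grad_w = grad_b = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        grad_w += (p - y) * x
        grad_b += (p - y)
    w -= lr * grad_w / len(data)
    b -= lr * grad_b / len(data)

def defer(conf, threshold=0.8):
    """Defer to a human when predicted autograder correctness is low.

    The 0.8 threshold is an assumed operating point, not from the paper.
    """
    p_correct = 1.0 / (1.0 + math.exp(-(w * conf + b)))
    return p_correct < threshold

# Fraction of essays this policy would send to a human assessor.
deferred_fraction = sum(defer(x) for x, _ in data) / len(data)
```

Raising the threshold trades less human effort for more autograder errors slipping through; the paper's point is that the tradeoff point can be learned from a small amount of labeled data rather than set by hand.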
  6. Building causal knowledge is critical to science learning and scientific explanations that require one to understand the how and why of a phenomenon. In the present study, we focused on writing about the how and why of a phenomenon. We used natural language processing (NLP) to provide automated feedback on middle school students’ writing about an underlying principle (the law of conservation of energy) and its related concepts. We report the role of understanding the underlying principle in writing based on NLP-generated feedback. 
  7. In principle, educators can use writing to scaffold students’ understanding of increasingly complex science ideas. In practice, formative assessment of students’ science writing is very labor-intensive. We present PyrEval+CR, an automated tool for formative assessment of middle school students’ science essays. It identifies each idea in a student’s science essay, along with that idea’s importance in the curriculum. 